DynamicRead: Exploring Robust Gaze Interaction Methods for Reading on Handheld Mobile Devices under Dynamic Conditions
Enabling gaze interaction in real-time on handheld mobile devices has
attracted significant attention in recent years. An increasing number of
research projects have focused on sophisticated appearance-based deep learning
models to enhance the precision of gaze estimation on smartphones. This
inspires important research questions, including how the gaze can be used in a
real-time application, and what types of gaze interaction methods are preferable
under dynamic conditions, in terms of both user acceptance and reliable
performance. To address these questions, we design four types of gaze
scrolling techniques: three explicit techniques based on Gaze Gesture, Dwell
time, and Pursuit, and one implicit technique based on reading speed, to support
touch-free page scrolling in a reading application. We conduct a
20-participant user study under both sitting and walking conditions. Our
results reveal that the Gaze Gesture and Dwell time-based interfaces are more
robust while walking, and that Gaze Gesture achieves consistently good
usability scores without imposing a high cognitive workload.

Comment: Accepted by ETRA 2023 as a full paper, and as a journal paper in
Proceedings of the ACM on Human-Computer Interaction.
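To illustrate the general idea behind one of the explicit techniques, here is a minimal sketch of a dwell-time scroll trigger: a scroll event fires once the gaze point has remained inside a target region for longer than a threshold. The class name, region format, and `dwell_s` parameter are all illustrative assumptions, not details from the paper.

```python
import time

class DwellScrollTrigger:
    """Illustrative dwell-time scroll trigger (not the paper's implementation).

    Fires a scroll event when the gaze point stays inside a target
    region (e.g. an on-screen scroll area) for at least `dwell_s` seconds.
    """

    def __init__(self, region, dwell_s=0.8):
        self.region = region      # (x_min, y_min, x_max, y_max) in pixels
        self.dwell_s = dwell_s    # dwell threshold in seconds
        self._enter_t = None      # time the gaze entered the region

    def _inside(self, x, y):
        x0, y0, x1, y1 = self.region
        return x0 <= x <= x1 and y0 <= y <= y1

    def update(self, x, y, t=None):
        """Feed one gaze sample; return True when a scroll should fire."""
        t = time.monotonic() if t is None else t
        if not self._inside(x, y):
            self._enter_t = None  # gaze left the region: reset the timer
            return False
        if self._enter_t is None:
            self._enter_t = t     # gaze entered the region: start timing
        if t - self._enter_t >= self.dwell_s:
            self._enter_t = None  # fire once, then require re-entry
            return True
        return False
```

In practice such a trigger would be fed by the gaze estimator's per-frame output; the reset-on-fire behavior prevents continuous scrolling while the gaze lingers, which is one common way to mitigate the Midas-touch problem that dwell-based interfaces face.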